

Information  and  Technology  for  Better  Decision  Making 

2006  Survey  of  Reserve 
Component  Spouses 

Administration,  Datasets,  and  Codebook 


Additional copies of this report may be obtained from:
Defense Technical Information Center
ATTN: DTIC-BRR
8725 John J. Kingman Rd., Suite #0944
Ft. Belvoir, VA 22060-6218
Or  from: 


http://www.dtic.mil/dtic/order.html 
Ask for report by
DMDC Report No. 2006-030
March  2007 


2006  SURVEY  OF  RESERVE  COMPONENT 

SPOUSES: 

ADMINISTRATION,  DATASETS,  AND  CODEBOOK 


Defense  Manpower  Data  Center 
Survey  &  Program  Evaluation  Division 
1600  Wilson  Boulevard,  Suite  400,  Arlington,  VA  22209-2593 




Acknowledgments 

Defense Manpower Data Center (DMDC) is indebted to numerous people for their assistance with the 2006 Survey of Reserve Component Spouses (2006 RCSS), which was conducted on behalf of the Office of the Under Secretary of Defense for Personnel and Readiness (OUSD[P&R]). The survey program is conducted under the leadership of Timothy Elig, Chief of the Survey and Program Evaluation Division.

Policy officials contributing to the development of this survey included: John Winkler, Wayne Spruell, Tom Bush, Virginia Hyland, Dan Kohner, Richard Krimmer, James Scott, Col Kathleen Woody (USAFR), and Col Nilda Urrutia (ANG), all from the Office of the Assistant Secretary of Defense for Reserve Affairs. Other contributing officials included Jane Burke, Cathy Flynn, and Lin Porter (Military Community and Family Policy). Howard Weiss of Purdue University was also an important contributor.

DMDC’s  Program  Evaluation  Branch,  under  the  guidance  of  Brian  Lappin,  Branch  Chief, 
is  responsible  for  the  development  of  questionnaires  used  in  the  survey  program.  The  lead 
developer  on  this  survey  was  Rachel  Lipari.  She  was  supported  in  these  efforts  by  Lindsay 
Rock,  DMDC,  and  Megan  Shaw  and  Kristin  Olson,  Consortium  Research  Fellows. 

DMDC's Survey Technology Branch, under the guidance of James Caplan, Branch Chief, is responsible for monitoring survey administration and survey database construction. The lead analyst on this survey was Margaret Emma Holland, SRA International, Inc. She was supported by Matthew Perry, Consortium Research Fellow. Data Recognition Corporation (DRC) performed data collection and editing.

DMDC's Personnel Survey Branch, led by Richard Riemer, former Branch Chief, and Jean Fowler, current Branch Chief, is responsible for the sampling and weighting methods used in the survey program. Michael Paraloglou, SRA International, Inc., used the DMDC Sampling Tool to plan the sample and developed the weights for this survey. He also used DMDC's Statistical Analysis Macros to calculate the estimates presented in this volume. Susan Reinhold and Carole Massey, DMDC, and Deborah West, Northrop Grumman Corporation, provided programming support for the sampling and weighting tasks.




Table  of  Contents 


Page 

Introduction . 7

Overview of Report . 7

Method . 8

Survey Instrument . 8

Sample . 10

Respondents . 13

Survey Development and Administration . 16

Address Update Procedures . 21

Processing of Updates . 23

Survey Materials and Their Distribution . 24

Processing Returned Surveys . 27

Survey Analysis Files . 29

Estimation . 29

Data Structure . 30

Variables in the Survey Analysis Files . 31

Using Appendix H . 34

References . 38

Appendix 

A. 2006 Survey of Reserve Component Spouses: Paper Form . A-1

B. 2006 Survey of Reserve Component Spouses: Web Form . B-1

C. Communications . C-1

D. Annotated Web Survey Form . D-1

E. Coding Scheme . E-1

F. Alphabetical Variable List for the Survey Analysis Files . F-1

G. Positional Variable List for the Survey Analysis Files . G-1

H. Frequency and Percentage Distributions for Variables in the Survey Analysis Files . H-1

I. Flat File Layout for the Basic-Release Data File . I-1

J. Notes on Analysis . J-1

K. Examples of Analysis . K-1

L. Crosswalk of Web Instrument to Paper Instrument . L-1

M. Crosswalk of RCSS to Previous Reserve Component Member and Spouse Surveys . M-1






List  of  Tables 

1. Member Stratification Variables . 11

2. Factors Defining Key Reporting Domains (Member) . 12

3. Sample Allocation for the 2006 Survey of Reserve Component Spouses by Member Characteristics . 13

4. Final Sample Relative to Drawn Sample . 14

5. Location Rates, Response Rates, and Completion Rates . 15

6. Mailing Timeline and Return Results . 18

7. E-mail Address Availability by Service . 26

8. Analysis File Names . 30

List of Figures

1. Survey Control System . 19

2. Address Updating Procedures . 22

3. The Structure of the Confidential File . 31

4. Annotated Example of a Table from Appendix G . 35


2006  SURVEY  OF  RESERVE  COMPONENT  SPOUSES: 
ADMINISTRATION,  DATASETS,  AND  CODEBOOK 

Introduction 

The  Human  Resources  Strategic  Assessment  Program  (HRSAP),  Defense  Manpower 
Data  Center  (DMDC),  conducts  both  Web-based  and  paper-and-pencil  surveys  to  support  the 
personnel  information  needs  of  the  Under  Secretary  of  Defense  for  Personnel  and  Readiness 
[USD(P&R)].  These  surveys  assess  the  attitudes  and  opinions  of  the  entire  Department  of 
Defense  (DoD)  community  on  a  wide  range  of  personnel  issues.  A  Web-based  survey  program 
with  postal-  and  e-mail  notification,  known  as  the  Status  of  Forces  Surveys  (SOFS),  provides 
data  several  times  per  year  on  active-duty  and  Reserve  component  members  and  DoD  civilian 
employees.  Paper-and-pencil  surveys  with  postal-  and  e-mail  notification  are  used  to  obtain  data 
on  sensitive  topics  (e.g.,  sexual  harassment)  and  from  populations  who  may  have  limited  Internet 
access  (e.g.,  spouses  of  active  and  Reserve  members). 

The 2006 Survey of Reserve Component Spouses (2006 RCSS) utilized both modes of administration — the Web as well as paper-and-pencil — and was designed to assess the attitudes and opinions of Reserve component spouses on a wide range of quality of life issues. Data were collected by mail and Web between November 2005 and June 2006.1 The sample consisted of 38,549 Reserve component spouses. A total of 11,001 eligible spouses returned usable surveys, which represents an adjusted weighted response rate of 31.4%.

Overview  of  Report 

DMDC (2006a) provides details on sampling and weighting.

This report also documents the procedures used to develop the instrument, design the sample, conduct the survey, process the data, and prepare analysis weights. Along with the survey instrument and communications to the sample members (Appendixes A, B, and C, respectively), the methods section includes details on how the survey was conducted.

Following the summary of the survey methodology is a description of the survey analysis file layout and key variables. Key concepts required for the analysis of complex survey data and the structure of records in the survey analysis files are introduced in this section; Appendixes D-M provide the supporting detail. The appendixes in this report include:

• Appendixes A and B - Paper and Web survey instruments.

• Appendix C - Samples of all possible communications sent to sample members during the survey administration: letters, e-mails, and brochure.


1 The initial survey field period closed February 9, 2006. There were 3,091 spouses incorrectly flagged as population ineligible during the original field period. DMDC elected to re-open the field from May 1 to June 1, 2006, to give them an opportunity to participate.




• Conventions for variable naming and construction are provided in Appendixes D (annotated questionnaire) and E (coding scheme).

• Appendixes F, G, and H list the names and values of all variables in the basic-survey dataset and the Privacy-Act confidential variables.

— Appendix F lists the variables in alphabetic order and flags the Privacy-Act confidential variables with an asterisk (*).

— Appendix G lists the variables in the order that they appear in the dataset. Variables with the same function are grouped together (i.e., all variables used for weighting are located together).

— Appendix H provides a frequency for each variable with the SAS values, OS flat file values, and SAS labels in the order that the variables appear in the dataset. In addition to the variables available on the basic-survey file, Appendix H contains details for the confidential variables that had to be suppressed to preserve the privacy of survey respondents and nonrespondents.

• Appendix I provides the record layout for the basic-survey flat file.

• The SAS code used to construct the analytic variables is included in Appendix J.

• Examples of analyses are provided in Appendix K.

• Appendixes L and M list all questionnaire items and identify where they have been used in previous DMDC surveys of Reserve component members or spouses.

Method 

Survey  Instrument 

Copies of the 2006 RCSS paper and Web questionnaires are provided in Appendixes A and B. The survey was subdivided into the following seventeen topic areas:

1. Background Information — Member's active-duty background and total years of military service; spouse military ID card and enrollment in DEERS; and spouse characteristics, including education, personal goals, race/ethnicity, U.S. citizenship, English as a second language, age, and personal experiences with the military.

2.  Housing — Distance  to  nearest  military  installation  and  problems  in  gaining  access  to 
installation. 


2 SAS® is a registered trademark of SAS Institute Inc., Cary, NC, USA.

3 The OS flat file is a text version of the dataset. The variables are in the columns and the records are in the rows. This data can be loaded into any statistical software package.




3.  Your  Spouse’s  Activations/Deployments — Member’s  time  away  from  home  and 
characteristics  of  activations  over  the  past  24  months,  including  duration,  volunteer 
status,  deployment,  and  location. 

4. Your Spouse's Activations/Deployments Since September 11, 2001 — Member's activation and deployment history since September 11, 2001, and current activation/deployment status; number of activations; spouse preparedness for activation; advance notification; household income before, during, and following activation; health and dental care coverage; impact of activation on financial well-being; number of hours spouse worked; number, length, and location of deployment(s); means of communicating with member; coping; member's post-deployment behavior; support services received by member; spouse preparation for member's return; difficulty of spouse's readjustment to member's return; interaction with military point of contact; and military-provided support for families.

5.  Effect  of  Deployments  on  Children —  Emotional/behavioral  impact  of  deployment  on 
children  and  ways  children  cope  with  deployments. 

6.  Activation/Deployment  Expectations — Reduced  and  extended  lengths  of 
activation/deployment,  member  leaving  sooner  than  expected,  and  member  having  less 
time  between  activations/deployments  than  expected. 

7.  Preparedness —  Spouse’s  awareness  of  and  access  to  important  family  documents 
during  deployments,  as  well  as  financial  steps  taken  to  prepare  for  deployments. 

8.  Feelings  About  the  National  Guard/Reserves — Overall  satisfaction  with  the  National 
Guard/Reserve  way  of  life;  support  for  member’s  participation;  factors  affecting 
spouse  support;  member’s  National  Guard/Reserve  career  plans;  likelihood  of 
activation/deployment  in  the  next  year;  and  spouse  commitment  to  member’s  staying 
in  the  National  Guard/Reserve. 

9.  Marital  History —  Military  membership  at  time  of  marriage,  years  married,  satisfaction 
with  marital  relationship,  and  change  in  frequency  of  problems  with  relationship  over 
past  year. 

10.  Children  and  Legal  Dependents —  Number  of  children  and  legal  dependents  living 
inside  or  outside  the  home. 

11. Child Care — Use of child care, primary source of child care, monthly cost of child care, days of work missed because of lack of child care, and the impact of child care issues on member staying in the National Guard/Reserve.

12.  Elder  Care — Number  of  elderly  family  members  receiving  care  from  spouse,  whether 
any  elderly  family  members  live  with  spouse,  and  amount  of  care  giving  required. 

13.  Employment —  Spouse  employment  status  and  history,  reasons  for  not  looking  for 
work,  hours  worked  per  week,  reasons  for  working  part-time,  characteristics  of 
principal  employment,  and  shiftwork. 




14.  Financial  Well-Being —  Financial  goals,  contributions  of  member’s  National 
Guard/Reserve  and  spouse’s  income  to  total  household  income,  financial  problems 
experienced,  saving  habits,  and  general  financial  condition. 

15.  Health  and  Well-Being — Perceptions  of  stress  and  social  support. 

16. Programs and Services — Use of, and satisfaction with, military-provided programs and services; the most likely way to learn about and use support programs and services; use of Military OneSource and the primary reason for not using it; use of, and satisfaction with, TRICARE programs and reasons for not using them; and comparisons between TRICARE and civilian health and dental plans.

17.  Communicating  with  You  (About  Survey) — Preference  for  Web  versus  paper  surveys 
and  reasons  for  not  completing  the  survey  on  the  Web. 

Sample 

The target population for the 2006 RCSS consists of spouses of Reserve component members from the Selected Reserve in Reserve Unit, Active Guard/Reserve (AGR/FTS/AR;4 Title 10 and Title 32), and Individual Mobilization Augmentee (IMA) programs from the Army National Guard (ARNG), U.S. Army Reserve (USAR), U.S. Navy Reserve (USNR), U.S. Marine Corps Reserve (USMCR), Air National Guard (ANG), and U.S. Air Force Reserve (USAFR), who (1) have at least six months of service at the time the questionnaire is first fielded and (2) are below flag rank. In addition, to remain eligible at the time of the survey, the spouse must have indicated being currently married to a Reserve component member. A Reserve component member married to another Reserve component member would be eligible for the survey depending on their spouse's status, not their own. The sample consisted of 38,549 individuals; 11,001 ultimately provided usable survey responses.

Constructing  the  Frame  and  Drawing  the  Sample 

The sample frame was constructed from DMDC's March 2005 Reserve Components Common Personnel Data System (RCCPDS) and the July 2005 Defense Enrollment Eligibility Reporting System (DEERS) Medical Point-in-Time Extract (PITE), if the spouses were also eligible for benefits. The actual source information for constructing the sampling frame and identifying key domains consisted of a computer accessible file totaling 418,276 spouse records. The sample drawn from the sampling frame consisted of 38,549 individuals. Table 3 presents a summary of the sample allocation by Service.

Stratification  Variables 

The  frame  was  stratified  (divided  into  mutually  exclusive  population  groups)  for 
sampling  using  the  six  variables  listed  in  Table  1. 


4 Names for this program vary among Reserve components: AGR/FTS/AR is a combination of Active Guard/Reserve (AGR), Full-Time Support (FTS), and Active Reserve (AR).




Table 1.

Member Stratification Variables

Dimension of Stratification             Levels
Active Status During Prior 24 Months    Not active in prior 24 months;
                                        Active SOC in prior 24 months;
                                        De-activated in prior 23 months
Reserve Component                       Army National Guard; US Army Reserve;
                                        US Naval Reserve; US Marine Corps Reserve;
                                        Air National Guard; US Air Force Reserve
Reserve Program                         TPU/Unknown; AGR 10; AGR32;
                                        Military Technicians; IMA
Paygrade Group                          E1-E3; E4/Enlisted Unknowns; E5-E6; E7-E9;
                                        W1-W5; O1-O3/Officer Unknown; O4-O6
Race/Ethnic Category                    Non-minority; Minority
Gender                                  Male/Unknown; Female

Researchers  identified  population  subgroups  of  particular  interest  to  policy  officials. 
These  reporting  domains  were  defined  using  the  demographic  variables  shown  in  Table  2. 
Multiple  versions  of  most  of  these  variables  were  created  to  permit  varying  levels  of  detail  for 
analysis  and  reporting. 

The sample size and allocation were determined using the DMDC Sample Planning Tool (Deever & Mason, 2002). The Tool uses a formal mathematical procedure (Chromy, 1987) to determine the minimum cost (i.e., minimum size) allocation that meets precision requirements (e.g., ± 5 percentage points) imposed on prevalence estimates for key reporting domains; a sketch of the sample-size bound such a requirement implies follows.
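To make the precision requirement concrete, the following minimal SAS sketch computes the number of completed surveys that a ± 5 percentage point requirement implies for a single reporting domain under simple random sampling. This is only the textbook bound, not the DMDC Sample Planning Tool itself; Chromy's algorithm additionally minimizes total cost across many overlapping domains, and the domain size N used here is hypothetical.

   data precision_bound;
      z = probit(0.975);                         /* 1.96 for 95% confidence        */
      p = 0.5;                                   /* worst-case prevalence          */
      e = 0.05;                                  /* +/- 5 percentage points        */
      n0 = ceil( (z**2) * p * (1 - p) / e**2 );  /* about 385 completes            */
      N = 5000;                                  /* hypothetical domain size       */
      n_fpc = ceil( n0 / (1 + (n0 - 1) / N) );   /* finite population correction   */
   run;

   proc print data=precision_bound; run;

Inflating such per-domain targets for expected location and response rates, and reconciling them across overlapping domains, is what the Sample Planning Tool automates.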




Table 2.

Factors Defining Key Reporting Domains (Member)

Factor                                           Levels
Active Prior 24 Months and De-active Special     Not active in prior 24 months;
Operations Code Prior 23 Months                  Active SOC in prior 24 months;
                                                 De-activated in prior 23 months
Active Special Operations Code on Prior 13       No ASOC 13-24 months;
to 24 Months                                     ASOC 13-24 months
Active Special Operations Code on Prior 12       No ASOC 1-12 months;
Months                                           ASOC 1-12 months
Active Special Operations Codes on Prior 24      Noble Eagle; Enduring Freedom;
Months                                           Iraqi Freedom (SOFR0309)
Reserve Component                                U.S. Army National Guard; U.S. Army Reserve;
                                                 U.S. Naval Reserve; U.S. Marine Corps Reserve;
                                                 Air National Guard; U.S. Air Force Reserve
Component                                        Reserves; National Guard
Pay Grade Group                                  E1-E3; E4; E5-E6; E7-E9; W1-W5; O1-O3; O4-O6
Program                                          TPU; AGR 10; AGR32; MILTECH; IMA
Race-ethnic                                      Non-minority; Non-Hispanic Black;
                                                 Hispanic; Other Race

Within each stratum, the sample was selected with equal probability and without replacement. Sampling rates varied across the strata, so individuals were not selected with equal probability overall; a minimal sketch of this kind of stratified selection follows. Table 3 presents a summary of the sample allocation for the total population and by activation status, race/ethnic category, gender, paygrade group, and Reserve program, by Service.


Table 3.

Sample Allocation for the 2006 Survey of Reserve Component Spouses by Member Characteristics

Sample                               Total     ARNG     USAR     USNR    USMCR      ANG    USAFR

Total                               38,549    6,700    7,105    5,570    7,141    6,749    5,284

Activated/Deactivated
Not active in prior 24 months       22,108    3,350    3,936    4,465    2,407    4,148    3,802
Active SOC in prior 24 months          515      100       56       25      191      111       32
De-activated in prior 23 months     15,926    3,250    3,113    1,080    4,543    2,490    1,450

Activated/Not Activated
No Active SOC 1 to 24 months        22,106    3,349    3,936    4,465    2,406    4,148    3,802
Active SOC 1 to 24 months           16,443    3,351    3,169    1,105    4,735    2,601    1,482

Race/Ethnic Category
Unknown                              1,087       73       98      360      304      106      146
Non-minority                        24,423    4,148    3,757    3,247    4,710    5,071    3,490
Minority                            13,039    2,479    3,250    1,963    2,127    1,572    1,648

Gender
Male                                31,209    5,887    5,443    4,386    6,574    5,255    3,664
Female                               7,340      813    1,662    1,184      567    1,494    1,620

Pay Grade Group
E1-E3                                3,377      458      710      501    1,135      273      300
E4/Unknown                           8,962    1,888    1,873    1,118      899    1,689    1,495
E5-E6                                7,978    1,295    1,650    1,092    1,373    1,853      715
E7-E9                                3,750      384      842      229    1,026      785      484
W1-W5                                1,328      720      249       62      297        0        0
O1-O3                                5,607    1,055      989    1,156      332      956    1,119
O4-O6                                7,547      900      792    1,412    2,079    1,193    1,171

Reserve Program
Unknown                              1,101      267      144      409       66      116       99
TPU                                 28,923    5,370    5,978    4,178    5,213    4,654    3,530
AGR/TAR                              3,609      562      473      948      819      682      125
Military Technicians                 2,442      501      261        0        0    1,297      383
IMA                                  2,474        0      249       35    1,043        0    1,147

Note. Column headings use the component abbreviations defined above (ARNG = Army National Guard, USAR = Army Reserve, USNR = Naval Reserve, USMCR = Marine Corps Reserve, ANG = Air National Guard, USAFR = Air Force Reserve). Counts for unknowns may not be included.




Respondents 


Sample  Losses 

The  original  sample  file  contained  38,549  records.  Losses  to  the  drawn  sample  are  listed 
in  Table  4  and  reviewed  here.  Sample  members  were  lost  from  the  sample  for  three  main 
reasons:  (1)  self-reported  or  other  ineligibility  for  the  survey,  (2)  an  inability  to  locate  the  sample 
member,  and  (3)  refusal  to  participate  in  the  survey  or  other  failure  to  respond  to  the  survey. 

A total of 3,588 sample members (9.31%) were lost from the final sample through classification as ineligible. Elimination of ineligibles decreased the sample to 90.69% (N = 34,961) of its original size.


Table 4.

Final Sample Relative to Drawn Sample

                                                   Sample counts       Weighted estimates
                                                                         of population
                                                      n        %           n        %
Drawn sample                                       38,549              418,267
Ineligible on master files                         -1,935    5.02%     -19,170    4.58%
Self-reported ineligible                           -1,653    4.29%     -17,560    4.20%
Total: Ineligible                                  -3,588    9.31%     -36,730    8.78%
Eligible sample                                    34,961   90.69%     381,537   91.22%
Not located (estimated ineligible)                   -164    0.43%      -1,408    0.34%
Not located (estimated eligible)                   -1,226    3.18%     -11,819    2.83%
Total not located                                  -1,390    3.61%     -13,227    3.16%
Located sample                                     33,571   87.09%     368,310   88.06%
Requested removal from survey mailings               -138    0.36%      -1,561    0.37%
Returned blank                                       -220    0.57%      -2,157    0.52%
Skipped key questions                                -981    2.54%     -12,508    2.99%
Did not return a survey (estimated ineligible)     -2,508    6.51%     -23,505    5.62%
Did not return a survey (estimated eligible)      -18,723   48.57%    -197,363   47.19%
Total: Nonresponse                                -22,570   58.55%    -237,094   56.68%
Usable responses                                   11,001   28.54%     131,216   31.37%

In general, spouses' residential addresses were used as the primary addresses of choice, followed by the members' residential addresses. In cases where residential addresses could not be identified, however, member unit addresses were used. Procedures used to locate spouses of Reserve component members are explained in a later section that describes the Survey Control System. Because of this address update procedure, only 3.61% of the drawn sample (1,390 of 38,549) was lost because the sample members could not be located. Personnel records for this group had missing, incomplete, or out-of-date addresses, and steps designed to obtain complete, current addresses for these records were unsuccessful.

Losses  attributable  to  either  ineligibility  or  unlocatability  resulted  in  a  sample  that  was 
87.09%  of  the  drawn  sample.  Individuals  in  this  remaining  sample  may  be  further  categorized  as 
nonrespondents  versus  respondents.  Nonrespondents  included  the  following  groups:  sample 
members  who  contacted  the  operations  contractor  (by  mail,  fax,  e-mail,  Web,  or  telephone)  and 
asked  to  have  their  names  removed  from  the  survey  mailing  list,  and  21,231  sample  members 
who  did  not  return  a  survey. 

Respondents included all sample members who completed at least 50% of the applicable questions.5 Respondents also needed to answer the two critical questions that determined eligibility (the spouse's marital status and the member's military status). At the conclusion of the survey fielding, 11,001 eligible, locatable sample members had returned usable surveys.

Location,  Response  and  Completion  Rates 

Beginning in 1995, DMDC standardized its methods for calculating response rates and completion rates using procedures patterned after those advocated by the Council of American Survey Research Organizations (CASRO). CASRO noted that varying operational definitions of response rates can lead to problems or confusion (e.g., when awarding contracts requiring pre-specified response rates or when interpreting the results of a survey). As a result, CASRO formed a task force to recommend guidelines for standardizing the operational definitions of response rates. The new DMDC procedures closely follow CASRO's Sample Type II design (see Council of American Survey Research Organizations, 1982).

Table 5 provides location, response, and completion rate information. The location rate is defined as the proportion of eligible sample members that were located. The completion rate is defined as the proportion of the located sample that returned usable surveys. The response rate is defined as the proportion of eligible sample members that returned usable surveys. A sketch showing how the observed rates follow from the Table 4 counts appears after Table 5.


Table 5.

Location Rates, Response Rates, and Completion Rates

                                  Observed Operational Rates    Weighted Operational Rates
Location rate for eligible                   96.2%                         96.7%
Completion rate for eligible                 35.4%                         38.1%
Response rate for eligible                   34.1%                         36.8%

5 Applicable questions are those to be completed by all respondents; items that could be skipped depending on prior answers were excluded.
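The observed operational rates can be reproduced from the Table 4 counts. The following minimal SAS sketch assumes the CASRO-style adjustment removes the cases estimated to be ineligible (164 among the not-located, 2,508 among the nonrespondents) from both the eligible and located bases; under that assumption it matches the observed column of Table 5.

   data casro_rates;
      eligible = 34961 - 164 - 2508;   /* eligible sample less estimated ineligibles = 32,289 */
      located  = 33571 - 2508;         /* located sample less estimated ineligibles  = 31,063 */
      usable   = 11001;                /* usable responses                                    */
      location_rate   = located / eligible;   /* 0.962 */
      completion_rate = usable  / located;    /* 0.354 */
      response_rate   = usable  / eligible;   /* 0.341 */
   run;

   proc print data=casro_rates; run;

Applying the same logic to the weighted counts in Table 4 reproduces the weighted column (96.7%, 38.1%, and 36.8%).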




Survey  Development  and  Administration 


The 2006 RCSS continues a line of research on active-duty spouses begun with the 1985 DoD Surveys of Officer and Enlisted Personnel and Military Spouses. In 1992 and 1999, DMDC conducted subsequent Joint Service surveys of active-duty spouses. Many key topics covered by the 2006 RCSS were also included in its predecessors; however, questions have been updated, expanded, or streamlined in the 2006 RCSS. The survey was administered by both Web and paper-and-pencil questionnaires. Although both versions largely covered the same content, the question numbering differed. The survey forms are in Appendixes A and B.

The survey was hosted on the operations contractor's secure Web site so that sample members could complete the survey online. At the entry point to the survey, sample members were prompted for their personal ticket number to gain entry to the survey. The Privacy Notice and a page of frequently asked questions (FAQs) were linked from here.

The survey allowed respondents to return to the previous page or move to the next page. In addition, buttons located below the last question on each page allowed the respondent to clear their response(s) or save and exit the survey. Questions were answered by clicking on radio buttons or check boxes, or by making a choice from a drop-down list. The respondent could change answers or could save, exit, and return at another time to change answers. The final page had another "Save and Exit" button and a "Done" button, both with full text explanation of their functions.

For those people who had not completed the questionnaire on the Web system, we mailed the paper form to sample members along with the third reminder, sent on December 19, 2005 (see Table 6 for more information on the mailings).

Survey  Administration 

The survey administration process began in November 2005, with the mailout of notification letters to sample members (minus original ineligibles). The original field period was November 7, 2005, through February 9, 2006. Up to three additional postal communications were mailed to sample members throughout this field period. The survey field was re-opened on May 1, 2006, in order to communicate with 3,566 sample members originally misclassified as ineligible. During the May field period, a postal notification and one postal reminder were sent. The field closed on June 1, 2006.

In addition, sample members with a valid e-mail address on record could have received an e-mail notification plus up to eight e-mail reminders during the November-February field period. During the May field period, sample members with a valid e-mail address could have received an e-mail notification and up to four e-mail reminders. Postal and e-mail mailings stopped once the sample member returned a survey.

May Fielding Ticket Mismatch

Identified Issue: DRC associated incorrect Web Ticket Numbers with the re-field when preparing the postal notification letter, dated May 1, 2006. This was discovered by DRC approximately two weeks after the notification letters were mailed. The mismatch resulted in Web survey returns with incorrect Ticket Numbers.

Subsequent  postal  communications  for  the  re-field  provided  correct  Ticket  Numbers.  All 
e-mail  communications  for  the  re-field  provided  correct  Ticket  Numbers. 

Process: After discussions with DMDC, the following screener question was presented prior to the first survey question. This was posted May 18, 2006, at 11:06 AM CDT.

Did you access this survey using the Ticket Number from a postal letter dated May 1, 2006?

Yes (value = 1)

No (value = 2)

If  the  Next  Page  button  was  selected  without  answering  the  question,  respondents 
received  a  reminder  pop-up  that  it  had  not  been  answered.  After  one  reminder,  respondents  were 
allowed  to  advance  to  Question  1  of  the  survey. 

Analysis: There were 179 Web survey returns (165 complete and 14 partial).

• Using the screener question, the survey submit date (SRDATE), and the presence of an e-mail address, DRC determined the correct data match for 159 Web returns.

• There were 20 returns for which the source could not be identified. These returns were designated as .B in the dataset.

• The variable created for these returns was MIS_MTCH, where (a sketch of the recode follows this list):

— . = not mismatched

— 0 = Corrected Ticket

— 1 = Records Un-Matchable
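The following minimal SAS sketch illustrates the recode just described. Only MIS_MTCH, SRDATE, and the .B designation come from this report; the input dataset name and the flags REFIELD_RETURN and MATCH_RESOLVED are hypothetical stand-ins for DRC's internal bookkeeping.

   data web_returns;
      set web_returns;
      if refield_return ne 1 then MIS_MTCH = .;      /* not a May re-field return:
                                                        not mismatched                     */
      else if match_resolved = 1 then MIS_MTCH = 0;  /* screener/SRDATE/e-mail match
                                                        found: corrected ticket            */
      else MIS_MTCH = 1;                             /* record un-matchable; its survey
                                                        answers were designated .B         */
   run;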




Table 6.

Mailing Timeline and Return Results

                                           Print File      Mail Drop    Number     Number
Mailing Numbers and Groups                 Creation Date*   Date        Sent       of PNDs

Notification Domestic                      10/28/05         11/7/05     32,631     2,051
Notification Foreign                       10/28/05         11/7/05         72        21
Notification Domestic Reminder 1           11/22/05         11/21/05       748       122
Notification Foreign Reminder 1            11/22/05         11/21/05        26         8
Subtotal: Notification                                                  33,477     2,202
Reminder 1 Domestic                        11/28/05         12/1/05     31,128     1,459
Reminder 1 Foreign                         11/28/05         12/1/05         96        49
Reminder 1 Domestic Re-mail 1              12/6/05          12/7/05        706       108
Reminder 1 Foreign Re-mail 1               12/6/05          12/7/05         76        43
Subtotal: Reminder 1                                                    32,006     1,657
Reminder 2 Domestic                        12/9/05          12/14/05    28,389       793
Reminder 2 Foreign                         12/9/05          12/14/05       162        96
Reminder 2 Domestic Re-mail 1              12/16/05         12/19/05       305        51
Reminder 2 Foreign Re-mail 1               12/16/05         12/19/05        25        13
Subtotal: Reminder 2                                                    28,881       953
Reminder 3 Domestic                        12/19/05         1/5/06      27,112       597
Reminder 3 Foreign                         12/19/05         1/5/06         148        72
Reminder 3 Domestic Re-mail 1              1/10/06          1/11/06        662       102
Reminder 3 Foreign Re-mail 1               1/10/06          1/11/06         34        13
Reminder 3 Domestic Re-mail 2              1/16/06          1/17/06         10         4
Subtotal: Reminder 3                                                    27,966       788
Reminder 4 Domestic                        1/19/06          1/26/06     23,181       170
Reminder 4 Foreign                         1/19/06          1/26/06         84        10
Reminder 4 Domestic Re-mail 1              1/30/06          1/31/06        168         2
Reminder 4 Foreign Re-mail 1               1/30/06          1/31/06          4         0
Subtotal: Reminder 4                                                    23,437       180
May 2006 Notification Domestic             4/25/06          5/1/06       3,518       272
May 2006 Notification Foreign              4/21/06          5/1/06          33        23
Subtotal: May 2006 Notification                                          3,551       295
May 2006 Reminder 1 Domestic               5/10/06          5/15/06      3,481       250
May 2006 Reminder 1 Foreign                5/10/06          5/15/06         33        12
May 2006 Reminder 1 Domestic Re-mail 1     5/22/06          5/23/06        134         5
May 2006 Reminder 1 Domestic Re-mail 2     5/22/06          5/23/06          7         1
Subtotal: May 2006 Reminder 1                                            3,658       268

*Print file creation date: This is the date records were identified for inclusion in the mailing and written to a print file.




Survey  Control  System 


The Survey Control System (SCS)6 was used to monitor the data collection process and to track all data transactions over the course of the survey administration. The datasets in the SCS include sample members' names and addresses, but do not contain data obtained from the survey instruments. Because of privacy concerns, SCS datasets are not available for basic release.


The operations contractor uses the SCS to store and update project data, monitor mailings, respond to documents returned as postal non-deliverables (PNDs), and determine survey participation and eligibility status. The SCS consists of five datasets: the ORIGDAT file, the ADDRESS file, the MASTER file, the HISTORY file, and the MAILING file. Figure 1 displays the relationships among those datasets.


Figure  1. 

Survey  Control  System 


ORIGDAT file. The ORIGDAT file consists of 38,549 records, one record for each member of the sample. It is the original sampling frame file sent to the operations contractor by DMDC. The original file is loaded onto the operations contractor's computer system and converted to a SAS dataset. As the file was converted into a SAS dataset, the SCS generated a unique identification number (INRECNO) for each record. This number identifies the sample member throughout the SCS and also in returns datasets, comment text files, and other-specify text files. The names and some demographic data from the ORIGDAT file were loaded into the MASTER file in preparation for the first mailing. The addresses from the ORIGDAT file were loaded into the ADDRESS file.

6 The SCS refers to the set of data files as well as the program or operating system which maintains those files.

ADDRESS file. The ADDRESS file tracked the postal and e-mail addresses that were maintained for each sample member. The ADDRESS file contains one record for each postal and e-mail address for each sample member (e.g., if there were five addresses located for one sample member during the survey administration, that sample member has five separate records in the ADDRESS file), yielding an ADDRESS file containing 164,819 records. Each record is uniquely identified by the combination of INRECNO (identifying the sample member) and an address number (ADDRNO) assigned to each address. This address number is the sequential order of receipt of the address for a particular sample member. For example, if a sample member has one address record in the ADDRESS file, the address number for that record is one. If the sample member faxed in a change of postal or e-mail address, or a credit bureau forwarded an updated postal address for that sample member, the new address was added as address number two. The ADDRESS file was initially loaded with postal and e-mail addresses from the ORIGDAT file. Each record in the ADDRESS file includes the sample member's INRECNO, the address, the source of the address, an address priority code (a variable indicating whether the record is the highest priority address for this sample member), and variables indicating whether the address successfully reached the sample member.

The priority code assigned to a given address number for a sample member was used to determine the "best" or "highest priority" address for the sample member at any given time. It was originally determined by the source of the address. Address updates obtained directly from a sample member received a priority number of one. The order of priority of address sources, from highest to lowest, is as follows (a sketch of selecting the best address appears after the list):

1.  updates  directly  from  a  sample  member  (call,  fax,  e-mail,  Web  update  or  letter) 

2.  address  corrections  from  the  U.S.  postal  service  (ACS  [electronic  address  change 
service],  ACRs  [address  correction  requests],  and  ODFs  [out-of-date-forwarded  mail]) 

3.  NCOA-updated  addresses 

4.  credit  bureau-updated  addresses 

5.  DEERS  residential  addresses 

6.  DEERS  unit  addresses 
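Given one ADDRESS record per (INRECNO, ADDRNO) pair carrying a priority code (1 = highest), the current "best" address can be pulled with a sort and a first-dot test. This is a minimal SAS sketch of the rule described above, not DRC's production code; the variable name PRIORITY is an assumption.

   /* Best address = lowest priority code; ties broken by the most    */
   /* recently received address (highest ADDRNO).                     */
   proc sort data=address;
      by inrecno priority descending addrno;
   run;

   data best_address;
      set address;
      by inrecno;
      if first.inrecno;   /* keep one record per sample member */
   run;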

MASTER file. The MASTER file is used by the SCS to select records for upcoming survey mailings. This file includes a record for each member of the sample and was initially created by extracting data from each record in the ORIGDAT file. Each MASTER record includes the sample member's INRECNO and the address numbers for the highest priority postal and e-mail addresses in the ADDRESS file for that sample member. The MASTER file accommodated data updates through an automated process (e.g., updating the address number in use after the receipt of a postal or e-mail nondeliverable or a Web update) or manual key entry (e.g., updating information in response to a telephone call, fax, letter return, or e-mail from a sample member). As new information was received for a particular record (including changes to the highest priority address), the SCS updated the MASTER record (N = 38,549) and wrote the old record to the HISTORY file. The MASTER file also contains a set of variables which summarize the sample member's mailing status.

HISTORY file. The HISTORY file is a chronicle of the changes that occurred to the MASTER file. Each HISTORY record is a subset of an outdated MASTER record, with the addition of a date and time stamp recorded as the record is updated. That is, a HISTORY record is created when there is a name, address, paygrade, or eligibility status change in the MASTER file. Thus, the HISTORY file contains as many observations as there are updates to the MASTER file.

MAILING file. The MAILING file tracked all survey mailings (postal and e-mail). This file contains one record for each item postal mailed or e-mailed during the survey administration, or for tracking postal address updates from credit bureaus (N = 153,219). Each MAILING record includes the INRECNO, the address number used, the date of mailing, the mailing status, the type of mailing, and the mailing identification code (MIC).

Address  Update  Procedures 


Initial  Address  Updates 

Prior to the first mailing, the operations contractor ensured all domestic residential addresses were formatted to conform to U.S. Postal Service standards. Once the addresses were standardized, they were sent to an outside vendor where they were checked against the National Change of Address (NCOA) database. The NCOA software updated the address records (in standardized format) based on change-of-address cards filed with the U.S. Postal Service. The updated NCOA address file was returned to the operations contractor and integrated into the SCS. The NCOA-updated addresses were added to the ADDRESS file and became the current ADDRNO with the highest priority code assigned in the MASTER file.

After the NCOA-updated data was added to the SCS, another file was compiled of sample members who had an incomplete address or an address identified by NCOA as an undocumented move (i.e., the sample member had moved, but NCOA did not have a new address). The operations contractor sent copies of this file to three credit bureaus (Experian, Trans Union, and CSC Credit Services)7 to determine whether a complete, up-to-date address for these sample members could be found. The results were integrated into the SCS, updating records in the ADDRESS file.

Ongoing  Address  Updates 

Address  update  procedures  also  occurred  when  (a)  additional  address  records  were 
received  after  NCOA  processing,  (b)  a  survey  document  was  returned  as  undeliverable,  (c)  a 


7Experian,  Trans  Union  and  CSC  Credit  Services  are  outside  vendors  with  consumer-credit  information  databases. 
Social  security  numbers  of  sample  members  with  incomplete  or  out-of-date  address  information  were  forwarded  to 
the  vendors  for  address  updates  when  the  mailing  dataset  contained  no  valid  address. 


21 


sample  member  self-reported  a  name,  rank,  or  address  change,  or  (d)  the  U.S.  Postal  Service 
forwarded  address  correction  information.  Figure  2  outlines  these  procedures. 


Figure 2.

Address Updating Procedures


As a new address was entered into the ADDRESS file, its source (NCOA, credit bureau, postal Address Correction Requested card, telephone call, fax, letter, Web, or e-mail) was recorded and a new address number was assigned. The priority assigned to the address was based upon the source of the update and the date and time of the address (see the description of priority for the ADDRESS file). At any given time, the current address used corresponded to the address number with the highest priority code.




If all known addresses for a sample member were returned as Postal Non-Deliverable Mail (PND), the sample member's record in the MASTER file was flagged "no address available." All "no address available" records were forwarded to the three credit bureaus. The credit bureaus returned files containing addresses for each submitted record, with the date on which the credit bureau received the address. If more than one address for a sample member was received from the credit bureaus, the address number corresponding to the address with the most recent receipt date received the highest priority code. If one or more of the credit bureaus returned a previously unattempted address, the MASTER and ADDRESS files were updated and a re-mail was sent to the sample member. If none of the vendors had an updated address for the sample member, the operations contractor designated the sample member "nonlocatable" and stopped further mailings.


Processing  of  Updates 
Updates  from  Sample  Members 

Sample members could provide an updated address in a variety of ways. Updates from sample members could be communicated via the toll-free telephone number (either by speaking to the operations contractor's Call Center staff or by leaving a voice mail message). In addition, sample members could provide updates by mail, fax, e-mail, or the survey Web site; all updated information was entered into the SCS. Updates made on the Web site were loaded directly into the SCS before the start of the survey; once the survey fielding period started, the Web update page was no longer available. Other updates were entered into the SCS by the operations contractor's Call Center staff by the close of business on the day following receipt of the update.

Updates  from  the  U.S.  Postal  Service 

There are several types of address updates provided by the postal service. They are detailed below; each includes a description of the processing steps.

1. Postal Non-Deliverable Mail (PND): The sample member moved and no forwarding address was available. The mail piece was returned to the operations contractor. The operations contractor removed the letter from the envelope and scanned it to capture the Mailing Identification Code (MIC) in the lower right corner. A file of the MICs was loaded to the SCS so the records could be updated as PND. This was done as necessary to coincide with the mailing/re-mailing schedule. If the sample member had another address on file (e.g., the unit address), that address was used for the next mailing. If no alternate address was on file, the Social Security Number was sent to the credit bureaus in search of a new address.

2. Address Change Service (ACS; electronic): About six weeks prior to the first mailing, the operations contractor applied to the postal service for the ACS. The postal service assigned a participant code, which was placed in the address block of the letter. The operations contractor requested semi-weekly files, which the postal service provided on diskette via Express Mail. The operations contractor loaded the files upon receipt or before another mailing was prepared.




3.  Address  Correction  Requests  (ACR;  hard-copy):  The  outbound  envelopes  contained 
the  endorsement  “Address  Service  Requested.”  The  post  office  provided  the 
corrections  via  hard  copy  cards  that  were  sent  to  the  operations  contractor.  The 
corrections  were  entered  into  the  SCS  by  the  operations  contractor’s  Call  Center  staff, 
typically  by  close  of  business  the  next  day  but  no  later  than  prior  to  the  preparation  of 
the  next  mailing. 

Survey  Materials  and  Their  Distribution 

Each eligible sample member received at most four original mailings: a notification letter and brochure explaining the survey program, a reminder letter, a reminder letter with a paper survey, and a final reminder letter. The notification and reminder mailings contained only a letter, except for the reminder that contained a letter, paper survey, and business reply envelope. All letters included information about using the Web as an option to complete the survey.


In addition, e-mail was used to communicate with sample members. Not every sample member had an e-mail address; however, those sample members for whom we had an e-mail address received an e-mail announcement and up to eight e-mail reminders. Samples of the letters and e-mail communications are provided in Appendix C.

General  Mailing  Procedures 

Prior  to  every  mailing,  the  SCS  searched  the  records  in  the  MASTER  file  to  identify 
which  records  should  be  excluded  (e.g.,  sample  members  self-reported  as  ineligible  for  survey 
participation,  sample  members  who  had  already  returned  survey  forms,  and  members  with  no 
valid  addresses  available).  For  re-mails  (sent  between  mailings),  the  SCS  identified  only  those 
records  that  had  been  updated  since  the  prior  mailing.  More  specifically,  the  SCS  identified 
records  that  had  resulted  in  PNDs  or  had  been  manually  flagged  for  re-mailing  (e.g.,  in  response 
to  a  sample  member  calling  the  operations  contractor  stating  she  or  he  had  received  a 
reminder/thank  you  letter  but  had  not  received  a  survey,  etc.). 

Once all records for a particular mailing or re-mailing were identified, the SCS processed the records based on whether the mailing would include a brochure and/or a survey form. If the mailing group was large enough to lead to a cost savings from sorting, the records were run through Group 1 postal software to sort the records according to first-class presort postal regulations. After this procedure, a unique Mail Identification Code (MIC) was assigned to each record. The MIC was assigned either from the survey litho code list if a survey form was sent, or independently if only a letter was sent.

Ticket  Numbers  for  Web  Survey  Access 

Prior to the first mailing, a list of ticket numbers8 for Web survey access was randomly generated. One secure ticket number was assigned to each sample member and remained linked to that member for the duration of the project. That is, while a member's MIC or lithocode changed with each mailing as described previously, the member's ticket number did not change. The member's unique ticket number was printed (along with the survey URL) in each letter and e-mail sent to that individual. A member could not access the Web survey without using his or her ticket number.

8 Ticket numbers are eight alphanumeric characters generated at random.
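A minimal SAS sketch of generating such tickets is shown below. The character pool and seed are assumptions; a production system would also enforce uniqueness across the 38,549 tickets, which this sketch omits.

   data tickets(keep=inrecno ticket);
      length ticket $ 8 pool $ 36;
      retain pool 'ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789';
      do inrecno = 1 to 38549;
         ticket = '        ';
         do pos = 1 to 8;
            /* draw one of the 36 characters uniformly at random */
            substr(ticket, pos, 1) = substr(pool, ceil(36 * ranuni(20061)), 1);
         end;
         output;
      end;
   run;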

Description  of  Letters 

Letters  were  printed  with  the  record’s  unique  MIC  listed  in  the  address  field  and  on  the 
lower  right  corner  of  the  letter.  If  the  mailing  included  only  letters  (no  brochures  or  survey 
forms),  the  letters  were  folded  and  machine  inserted  into  window  envelopes  and  sent  by  first 
class  mail.  Mailings  that  included  a  brochure  or  a  survey  followed  the  same  procedure  through 
the  letter  printing  process.  The  MIC  on  the  cover  letter  was  used  to  pair  the  letter  with  the 
correct  enclosure.  During  the  matching  process,  ten  percent  of  the  mailing  was  visually  checked, 
comparing  numbers  printed  on  the  letter  with  the  brochure  or  survey  number  for  quality  control. 
Any  mismatched  pairs  initiated  further  investigation  of  the  matching  process.  This  procedure 
ensured  that  each  brochure  or  survey  was  sent  to  the  person  designated  to  receive  it.  Depending 
on  the  sample  size,  the  letters  and  matched  enclosures  were  machine  or  hand  inserted  into 
envelopes,  metered  if  necessary,  and  sent  by  first  class  mail. 

The  status  of  each  mailing  was  tracked  throughout  the  data  collection  so  that  address- 
correction  information  could  be  incorporated  into  all  relevant  mailings.  When  a  mail  piece  came 
back  PND,  the  next  mail  piece  was  sent  to  a  new  address  (if  one  could  be  obtained  during  the 
mailing  period).  For  all  mail  pieces  that  came  back  PND,  re-mails  were  completed  if  a 
newer/updated  address  could  be  found. 

DMDC provided the operations contractor with the text, letterhead, and signature for the cover letters. The letters explained why the survey was being conducted, how the survey information would be used, and why participation was important. See Appendix C for copies of the letters. The letters were approved and printed on letterhead from the office of the Under Secretary of Defense and signed by the Under Secretary of Defense (Personnel and Readiness), David S.C. Chu. The letterhead and signature were printed in blue, and the text and recipient information of all letters were printed in black. In addition to including a name and address (which was also used as the mailing information for the window envelopes), each letter included a personalized salutation. The salutation addressed each sample member by his/her gender; for example, a letter to a female spouse would have included the salutation, "Dear Mrs. Smith."

Mailouts 

Table 6 lists the mailing dates and return results for each of the mailouts and re-mailings. The main notification mailing sent sample members a letter and brochure notifying them that they had been selected for this survey and encouraging their participation. The notification letter was mailed to 32,703 sample members on November 7, 2005.

The first reminder letter was sent to 31,224 sample members on December 1, 2005. The letter thanked sample members for completing the survey if they had done so, and reminded them to complete the survey if they had not. The second reminder letter was sent to 28,551 sample members on December 14, 2005. The letter again thanked sample members for completing the survey if they had done so, and reminded them to complete the survey if they had not.


The  third  reminder  mailing  provided  sample  members  the  option  to  complete  a  paper 
survey.  For  this  mailing,  a  letter,  paper  survey  and  a  folded  business  reply  envelope  were 
provided.  The  survey  packet  was  mailed  to  27,260  sample  members  on  January  5,  2006. 

The fourth postal reminder letter was sent to 23,265 sample members on January 26, 2006. The letter thanked sample members for completing the survey if they had done so, and reminded them to complete the survey if they had not.

The  field  was  re-opened  on  May  1,  2006,  as  stated  earlier.  The  second  notification 
packet  was  sent  to  sample  members  initially  flagged  as  ineligible  and  offered  sample  members 
the  option  to  complete  the  survey  on  paper  or  on  a  secure  Web  site.  For  this  mailing,  a  letter, 
paper  survey,  brochure  and  business  reply  envelope  were  provided.  This  packet  was  mailed  to 
3,551  sample  members  on  May  1,  2006. 

A reminder letter was sent to 3,514 sample members on May 15, 2006. The letter thanked sample members for completing the survey if they had done so, and reminded them to complete the survey if they had not.

E-mail was also used to communicate with sample members. E-mail addresses were purchased from an outside vendor, which maintains a customer database of e-mail addresses lawfully collected and compiled from consumers pursuant to a notice advising them that their personal data were being collected. Table 7 below shows the percentage of sample members, by Service, for whom at least one valid e-mail address was available.


Table 7.
E-mail Address Availability by Service

                              Army      Army     Navy     Marine   Air       Air
                              National  Reserve  Reserve  Corps    National  Force    Total
                              Guard                       Reserve  Guard     Reserve
Valid address available       14%       15%      14%      14%      15%       14%      14%
No valid address available    86%       85%      86%      86%      85%       86%      86%

Sample members with e-mail addresses received at most an e-mail notification and eight reminders. Table 8 lists the e-mail dates and the number of e-mail addresses that bounced. A "bounced" e-mail address indicates that the address was invalid at the time DMDC attempted contact; this is analogous to a postal PND. An e-mail counted as "sent" is not the same as an e-mail received; it is analogous to the non-PND return experienced during a mailed survey. It is not known whether the e-mail was delivered to the intended individual, only that it was not returned.




Table 8.
E-mail Communication Timeline

Communication Type    E-mail Drop Date    Number Sent    Number Bounced
Notification          11/21/05            6486           10215
Reminder 1            11/28/05            4467           41
Reminder 2            12/6/05             3833           20
Reminder 3            12/14/05            3336           23
Reminder 4            12/22/05            3054           15
Reminder 5            12/28/05            2928           24
Reminder 6            1/11/06             2693           27
Reminder 7            1/19/06             2439           27
Reminder 8            1/27/06             2306           0
May Notification      5/1/06              129            41
May Reminder 1        5/5/06              123            0
May Reminder 2        5/11/06             90             0
May Reminder 3        5/17/06             86             0
May Reminder 4        5/23/06             81             0

Processing Returned Surveys

Once a respondent completes the survey, data are stored in an indexed file on the Web (data) server. Web and paper survey returns are merged into one dataset; paper survey returns require additional work to input the data (explained below). Prior to providing each dataset to DMDC, the operations contractor copied the indexed file to their internal network using the FTP protocol. The data are then converted to a sequential format, and the validate program reads and loads the data into the dataset.

All returned paper surveys were logged in and opened by the operations contractor upon receipt. If the envelope contained the survey booklet and other materials (e.g., extra comments, photographs, non-relevant items), the operations contractor separated those materials from the survey. Bundles of this type of correspondence ("white mail") were sent to DMDC by regular surface mail or FedEx Ground after all surveys were received. If the white mail appeared to be urgent, the operations contractor contacted DMDC to determine how it should be handled.

Survey booklets were batched for image scanning and assigned a batch number. The booklets were separated into pages, stacked in page/booklet order, and forwarded for scanning. As the surveys were scanned, the batch number and a serial number (unique to each survey) were printed on each page of the survey.

The  surveys  were  machine-edited  for  light  marks,  multiple  marks,  and  alignment. 
Damaged  forms  were  repaired,  if  possible,  and  scanned  with  non-damaged  forms.  If  it  was  not 
possible  to  scan  the  documents,  they  were  batched  separately  and  key-entered. 




Regardless of the mode of survey submission, the operations contractor processed all survey information according to DMDC-approved administration plans and coding schemes.

DMDC Coding Scheme

To convert the raw data into the item scores that appear in the data files (basic-release and confidential files), DMDC provided the operations contractor with an annotated copy of the survey form (see Appendix D) and the coding notes (see Appendix E). Every attempt is made to capture all information from completed surveys and preserve the data so that secondary analysts can later create variables that were not anticipated by DMDC researchers. To accomplish these goals, DMDC subscribes to a variety of coding conventions (see Appendix D). DMDC uses "forward" coding when coding inconsistent answers in items with skip patterns: data on the starting question are accepted as marked, and data for the items within the skip pattern are edited to be consistent with the starting question. However, an unedited version of each item is preserved in a confidential dataset.


For Web respondents, the coding scheme is used to "smart skip" respondents, so that they do not see questions that their previous answers indicate do not apply to them. For example, if a respondent indicated on Question 18 (SR018 = 1) that they had never tried to go to a military installation since becoming a National Guard/Reserve spouse, they did not see Question 19, which asks "Since becoming a National Guard/Reserve spouse, have any of the following caused you problems in gaining access to your nearest military installation?" Only those who gave the affirmative answer (yes, I have tried to go to a military installation) were shown the question.
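For paper returns, which cannot enforce smart skips, the same logic is applied after the fact as a forward-coding edit. The following SAS data step is a minimal sketch of that edit for the Question 18/19 example; the dataset names, the Question 19 item names, and the special "skipped" code are illustrative assumptions (the actual special codes are documented in Appendix E).

   * Minimal forward-coding sketch for the Q18/Q19 skip pattern.     ;
   * Dataset names, SR019 item names, and the -2 code are assumed.   ;
   data edited;
      set raw;
      /* The starting question (SR018) is accepted as marked. */
      if SR018 = 1 then do;
         /* Never tried to gain access: edit Question 19 items to a  */
         /* "skipped/not applicable" special code (see Appendix E).  */
         SR019A = -2;
         SR019B = -2;
      end;
   run;

As noted above, the unedited version of each item is preserved in the confidential dataset, so the edit remains recoverable for methodological work.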

Coding  or  Keying  Open-Ended  Items 

The Web survey contained twenty-one open-ended items. The original text responses from these items were captured verbatim into a SAS® data set that is linked to the survey data by the unique identification number. The paper form had fewer open-ended items. The operations contractor keyed all verbatim responses. Text data in the SAS® files for open-ended items were spell-checked, and identifiers (e.g., proper names, addresses, e-mail addresses, phone numbers, locations, or social security numbers) were replaced with generic terms.

Fifty-Record  Check 

After receiving the first 50 returned records, the operations contractor ran a "50-record check." This is a check to verify that the coding scheme and skip patterns are working. DMDC checked the resulting data to determine if there were any unanticipated problems in the coding procedures (e.g., respondents consistently answering in an unexpected manner). Minor corrections to these procedures were necessary as a result of this check and were reviewed by DMDC prior to production of the initial SAS® dataset. At the completion of the 50-record check, the operations contractor compiled the full set of returned surveys. The data were then cleaned and edited following the coding scheme.




Survey Analysis Files

This section (a) provides an overview of requirements for analysis of the data, (b) documents the structure of survey analysis files created for the 2006 RCSS, (c) describes the assembly of the analysis files, and (d) provides an overview of the variables in the survey analysis files.


Estimation 

Analysis of these data requires the use of weights to compensate for the unequal selection probabilities and to account for differential nonresponse among population subgroups. The analytic weights were poststratified to population totals so that weighted sample estimates would reflect population values.

In  general,  the  procedures  used  to  compute  sample  estimates  of  population  parameters 
(including  population  totals,  means,  proportions),  tests  of  hypotheses,  regression  relations,  and 
their  associated  variances  are  derived  from  the  probability  structure  that  gives  rise  to  the 
observations.  As  with  other  surveys  involving  complex  probability  structures,  most  of  the 
parameter  estimates  of  interest  in  this  survey  take  the  form  of  non-linear  statistics.  Examples 
include  domain  means  and  proportions  where  the  denominator  values  are  unknown  and  must  be 
estimated  from  the  sample  data.  The  estimator  takes  the  form  of  a  ratio  of  random  variables  (i.e., 
the  ratio  of  the  estimated  numerator  and  denominator  totals  or  counts).  In  general,  ratio 
estimates  are  not  unbiased  and  their  variances  cannot  be  expressed  in  closed  form.  The  variances 
are,  therefore,  approximated.  The  bias  in  a  ratio  estimate  depends  on  the  variance  associated 
with  the  denominator  total  or  count  and  can  usually  be  ignored  in  samples  having  a  large  number 
of  observations.  As  a  working  rule,  the  bias  may  be  assumed  negligible  if  the  number  of 
observations  on  which  the  estimate  is  based  exceeds  30  or  is  otherwise  large  enough  so  that  the 
coefficient  of  variation  [SE(x)/x]  of  the  denominator  is  less  than  .10  (cfi,  Cochran,  1977,  pp. 
153-165). 
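In symbols, a sketch of this setup, with \(w_i\) the analytic weight and \(s\) the responding sample (the notation here is added for clarity and is not taken from the report):

\[
\hat{R} \;=\; \frac{\hat{Y}}{\hat{X}}
      \;=\; \frac{\sum_{i \in s} w_i\, y_i}{\sum_{i \in s} w_i\, x_i},
\qquad
\text{bias negligible when}\quad
cv(\hat{X}) \;=\; \frac{SE(\hat{X})}{\hat{X}} \;<\; 0.10 .
\]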

Two  common  variance  estimation  methods  for  complex  sample  data  are  linearization 
(Taylor  series  approximation)  and  replication.  Wolter  (1985)  provides  a  detailed  discussion  on 
methods  used  for  variance  estimation  from  sample  surveys,  including  Taylor  series 
approximation  and  replication  methods. 

Many of the standard statistical software packages, such as SPSS9 and older versions of SAS,10 compute variance estimates only for simple random samples. Using standard statistical programs with the appropriate eligibility indicator (ELIGFLGW) and the analytic weight (FINALWGT) to analyze these data will produce accurate point estimates, but the variance estimates will not account for the complex sample design. Variables have been included in the analysis file so that Taylor series estimates can be computed for a stratified, without-replacement design, using either SUDAAN or the recently available SAS Survey Procedures.


9  SPSS®  is  a  registered  trademark  of  SPSS  Inc.,  Chicago,  IL,  USA. 

10  SAS  added  survey  procedures  in  Version  7,  expanding  them  in  releases  8.0  and  higher. 




Data  Structure 


Care was taken in the preparation of the survey analysis files to provide basic access to data from the survey, with sufficient information for accurate estimation, while meeting requirements for participant and non-participant anonymity. As described below, some detailed
variables  have  been  deleted  from  the  basic-release  files  either  because  (a)  they  provide  too  great 
a  chance  of  identifying  an  individual  or  (b)  they  are  not  needed  to  analyze  the  survey  data.  For 
the  latter  reason,  some  demographic  variables  are  available  on  basic  files  only  in  a  collapsed 
version.  In  addition  to  a  basic-release  file,  a  confidential  file  (containing  a  more  complete  set  of 
variables  than  the  basic-release  file)  has  been  prepared  for  internal  DMDC  use.  Files  were 
prepared  as  SAS  and  SPSS  system  files.  An  ASCII  (Operating  System  or  OS)  flat  file  was 
prepared  from  the  basic-release  SAS  system  file.  File  names  are  indicated  in  Table  9. 


Table 9.
Analysis File Names

Type of File                   File Name
Basic-release File - SAS       RCSS06B.7BDAT
Confidential File - SAS        RCSS06C.7BDAT
Basic-release File - SPSS      RCSS06B.POR
Basic-release File - OS        RCSS06B.DAT

The structure of the confidential file is shown in Figure 3. The confidential file contains the basic-release file plus additional confidential variables. All variables in the confidential file are documented in this report. Appendixes F and G list all variables, with a notation to indicate which variables are confidential, and show where each variable is documented. Intermediate weighting variables that appear only in the confidential file are documented by DMDC (2006a). Variables that appear in collapsed form in the basic-release part of the file, and in a fuller version only in the confidential file, are discussed later.

Analyses

Both the confidential file and the basic-release file contain 38,549 records, one for each member of the sample. As depicted in Figure 3, these records can be divided into three subgroups. The Nonrespondents subgroup includes all records, indicated by ELIGFLGW=3, where no usable response was received or ineligibility could not be determined (27,548).

Assignment of a record to the other two subgroups was based on whether (a) an individual returned a "completed" survey and (b) the individual was eligible for the survey. Final eligibility was limited to those in both the March 2005 Reserve Components Common Personnel Data System (RCCPDS) and the July 2005 Defense Enrollment Eligibility Reporting System (DEERS) Medical Point-In-Time Extract (PITE) who did not contact the operations contractor to indicate that they were ineligible.




The analytic dataset should consist of records for the Known Self- or Proxy-reported Ineligibles and Eligible Respondents subgroups. Both the Eligible Respondents (ELIGFLGW=1) and Known Self- or Proxy-reported Ineligibles (ELIGFLGW=2) are included because both types of records were used for poststratification to population totals, and both types of records are needed to compute accurate variance estimates by Taylor series linearization. To analyze the eligible completed responses, use the analytic weight, FINALWGT; subset the file to ELIGFLGW = 1, 2 (i.e., records with non-zero weights); and restrict the subpopulation for analysis to ELIGFLGW=1.
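As an illustration only (not DMDC production code), this recipe might be carried out with the SAS Survey Procedures as follows. The construction of the strata-totals dataset, the example item, and the use of the DOMAIN statement (available in later SAS releases) are assumptions; TOTAL= supplies the sampling frame counts for the finite population correction under the stratified, without-replacement design.

   /* One record per variance stratum carrying the frame count (fpc) */
   proc sort data=rcss06b(keep=vstrat total eligflgw
                          where=(eligflgw in (1,2)))
             out=strtots(keep=vstrat total rename=(total=_total_))
             nodupkey;
      by vstrat;
   run;

   /* Taylor series estimates for the stratified design */
   proc surveymeans data=rcss06b total=strtots mean stderr;
      where eligflgw in (1,2);  /* records with non-zero weights      */
      strata vstrat;            /* variance estimation strata         */
      weight finalwgt;          /* final analytic weight              */
      domain eligflgw;          /* restrict estimates to eligibles    */
      var sr015;                /* example item; categorical items    */
                                /* would use PROC SURVEYFREQ instead  */
   run;

The DOMAIN statement keeps the ineligible (ELIGFLGW=2) records in the variance computation while reporting estimates separately for the eligible respondents, which is why both types of records carry non-zero weights.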


Figure 3.
The Structure of the Confidential File

[Diagram: each subgroup's records contain the basic-release file variables plus confidential and detailed methodological variables.]

Eligibility flag value and number of records:
   ELIGFLGW=3   n = 23,690
   ELIGFLGW=2   n = 1,653
   ELIGFLGW=1   n = 11,001

Note. The shaded portion represents the subset of the data typically required for analysis.


Variables  in  the  Survey  Analysis  Files 


Basic-survey Dataset

The variables in the basic-survey dataset fall into five categories: (1) information gathered on the survey, (2) variables constructed for analysis, (3) information on operations, (4) information from sampling and record data, and (5) information on weighting. Variables are grouped in these categories in Appendixes G and H.

Information gathered on the survey. These variables came directly from the survey or were constructed using only information from the survey. There is at least one variable for every item in the survey except for a few items that had to be removed to preserve confidentiality. The annotated questionnaire (see Appendix D) contains the item names, the values used to code the pre-specified alternatives, and references to applicable coding notes in Appendix E.

DMDC uses a standard naming convention for most variables. In general, the survey-derived variables can be classified as variables that begin with either "SA," "SR," or "X." The naming of "SR" variables is reviewed using the example variable "SR052A." For the 2006 Survey of Reserve Component Spouses, variable names begin with "SR" to denote the population (Reserve component spouse). The following three digits correspond to the questionnaire item number: the third through fifth characters indicate the main question number (052), and the sixth character typically indicates the sub-question item, in this example item A from the list of items in Question 52.

The "SR" prefix also serves as a mnemonic for self-report in a set of primarily demographic items that are identically named across all DMDC surveys, with the remainder of the name indicating the data being collected. For example, "SRRACE" is the variable name for the item that asks sample members what race they consider themselves to be. Although all survey data are self-reported, the "SR" is used to distinguish survey-reported information from DMDC-provided information (e.g., the variable "SRRACE" from the survey is differentiated from the variable "RACE" from DMDC databases). When possible, "X" is reserved for special crossing (marginal) variables created for key analyses. "X" variables typically involve imputation for missing data and, like "SR" variables, are intended to be consistent across DMDC surveys. For more information on variable naming conventions, see Appendix E.

Variables constructed for analysis. An "R" as the last letter of a variable listed in Appendixes F, G, and H indicates that the variable may have been recoded for special analyses. Only one version of each variable is available in the basic-release dataset. For example, certain demographic variables, including some information collected on the survey, had to be censored to preserve the anonymity promised to survey respondents and nonrespondents; SR015R is such a recoding of SR015.
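A minimal sketch of such a collapse, using the SR015 response codes shown later in Figure 4 (the grouping and dataset names are illustrative assumptions, not the actual DMDC recode):

   * Hypothetical collapse of SR015 into a basic-release SR015R.     ;
   data basic;
      set full;
      if SR015 in (1, 2) then SR015R = 1;   /* any "Yes" response    */
      else if SR015 = 3  then SR015R = 2;   /* "No"                  */
      else SR015R = SR015;                  /* carry special codes   */
   run;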

Certain key demographic variables were constructed for DMDC analyses. These analytic variables, starting with "X," are based primarily on self-reported information from the survey. Typically, where the self-reported information was missing on important demographics (e.g., Service, paygrade, location, or respondent gender), data were imputed from members' or spouses' administrative records.
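A sketch of that fallback pattern, using the SRRACE/RACE pair described above (the merged dataset and the XRACE name are assumptions for illustration):

   * Sketch of an "X" analytic variable: use the self-report, and    ;
   * impute from the DMDC record when the self-report is missing.    ;
   data analytic;
      set merged;                     /* survey joined to record data */
      XRACE = SRRACE;                 /* survey-reported value        */
      if missing(XRACE) then
         XRACE = RACE;                /* record-data fallback         */
   run;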

The  race  and  ethnicity  questions  were  combined  to  be  reported  in  accordance  with  the 
Standards  for  Maintaining,  Collecting,  and  Presenting  Federal  Data  on  Race  and  Ethnicity 
(1997).  Also,  items  were  combined  to  derive  employment  indicators  based  on  U.S.  Census 
Bureau’s  Decennial  Census  and  Current  Population  Survey  (2002). 

Appendix J documents many of the decisions made in the analyses reported by DMDC (2006b). For a large number of survey items, analysts must make decisions on the treatment of special codes (such as Not Applicable).




Information  on  operations .  The  DMDC-provided  identification  number,  RCSS2006,  is 
unique  and  is  used  to  identify  responses  as  they  are  processed.  Other  variables  are  created  by  the 
operations  contractor  but  are  too  detailed  to  be  in  the  basic-release  file. 

Information  from  sampling  and  record  data.  Most  of  the  variables  used  in  sample 
design  and  selection  are  too  detailed  to  be  in  the  basic-release  file  (see  the  later  section  on 
confidential  variables). 


Information on weighting. Derivation of weights is discussed in detail in DMDC (2006a). See Appendix K for examples of analyses using these variables:11

ELIGFLGW    Eligibility Flag
FINALWGT    Final Weight With Nonresponse and Poststratification Adjustments
VSTRAT      Variance Estimation Strata
TOTAL       Weighting Class Strata Totals Based on Sampling Frame Counts


Full Survey Dataset

In addition to the variables on the basic-survey dataset, the full survey dataset has five additional categories of variables: (1) the raw version of survey items that appear in a collapsed form in the basic-release section, (2) the raw version of key demographic variables used in analyses that appear in a collapsed form in the basic-release section, (3) detailed variables created by the operations contractor to document operations, (4) detailed variables used in sampling, and (5) detailed variables used in weighting. Variables are grouped in these categories in Appendixes F, G, and H.

Privacy  Act  confidential  variables — survey  data.  This  section  of  the  full  survey  dataset 
contains  the  original  survey  variables  that  had  a  recoded  version  in  the  basic-survey  dataset.  To 
the  extent  possible,  recoded  versions  of  these  variables  are  in  the  basic-release  file  section  under 
variables  constructed  for  analysis. 

Privacy  Act  confidential  variables — analysis  data.  This  section  of  the  full  survey 
dataset  contains  the  analytic  variables  constructed  by  DMDC.  To  the  extent  possible,  recoded 
versions  of  these  variables  are  in  the  basic-survey  dataset  section  under  variables  constructed  for 
analysis. 

Privacy  Act  confidential  variables — operations  data.  This  section  of  the  full  survey 
dataset  contains  operational  variables  created  by  the  operations  contractor.  These  variables  are 
useful  for  methodological  studies  and/or  were  used  in  determining  eligibility  and  response  status. 

The identifying variables describe how the record was processed once a survey was returned. The variables BATCH, SERIAL, and LITHO uniquely identify each returned survey. LITHO is the lithocode scanned from the survey. BATCH and SERIAL are the codes printed on


11 Two additional variables required for SUDAAN are on the dataset: NPSTRAT, poststratification population counts, and PSTRATA, poststratification strata.




the survey during scanning to identify the scan batch number and scan order of each survey. These numbers can be used to retrieve the paper copy of a survey for a short time after it has been scanned (e.g., should researchers want to check electronically stored information against the respondent's answers on the paper survey). DUPRET and DUPRET2 indicate the receipt of multiple returns: DUPRET2 includes blank returns in the multiple counts; DUPRET excludes these returns.

The classification variables describe how individual sample members' records were grouped and indexed. FLAG_FIN indicates the final disposition status of a sample member (i.e., survey returned, blank survey returned, not locatable, or no return). Several other classification variables were used to categorize a survey's final disposition: BLKREAS, SCSINEL, and REFUSE. BLKREAS codes the reason given by the sample member for returning a blank survey, SCSINEL indicates the reason given by the sample member for being ineligible, and REFUSE indicates whether a sample member refused to complete the survey.

Privacy  Act  confidential  variables — sampling  and  record  data.  This  section  of  the  full 
survey  dataset  contains  administrative  file  variables  and  constructed  variables  used  in 
determining  the  sampling  design.  It  also  includes  the  sampling  strata  identifiers  and  counts. 

Confidential  variables — weighting.  This  section  of  the  full  survey  dataset  contains 
variables  used  in  analysis  of  non-response  and  in  the  construction  of  the  weights. 

Using  Appendix  H 

Regardless of whether analysts use all or only portions of the database, all analysts should replicate the results found in the tables in Appendix H. It is only by replicating these results that analysts can be sure that they are reading the data correctly. An annotated example of an Appendix H table is shown in Figure 4. (The table does not reflect actual results.)




Figure 4.
Annotated Example of a Table from Appendix H

[1] 2006 Survey of Reserve Component Spouses

    Were your parent(s)/guardian(s) in a regular Reserve Component
    Service and/or National Guard/Reserve?

[2] SR015    [3] Were your parent(s)/guardian(s) in a regular Reserve
                 Component Service and/or National Guard/Reserve?

[4] OS DATA:   COLS NA-NA   LENGTH NA
[5] SAS DATA:  FORMAT NAME SR079   TYPE NUM   LENGTH 3   INFORMAT STDOS2

    FREQ[6]   PERCENT[7]   OS VALUE[8]   SAS VALUE[9]   MEANING[10]
      693        1.9           -9            .          No response
        2        0.0           -8            .A         Multiple response error
    23419       65.0           -1            .B         No survey return
     2135        5.9            1            1          Yes, while I was growing up
     3089        8.6            2            2          Yes, but only before I was born
     6716       18.7            3            3          No
    36054      100.1                                    TOTALS[11]

[12] PERCENT TOTAL DOES NOT = 100 DUE TO ROUNDING ERROR.

[13] G-2

Note. Bracketed numbers are annotation markers keyed to the numbered explanations below.

1. Codebook title and item text. The codebook title is the same for every table in Appendix H of this codebook. It lists the survey name. If applicable, the indented text under the title presents the verbatim question or instructions that accompany a specific item in the survey.

2. Variable name. The variable name for a survey item is up to eight characters in length and corresponds to the variable name that is used in the SAS®-based, basic-release data file. The conventions for naming survey-derived variables are documented in Appendix E. Appendixes F and G contain a full listing of the basic-release file variables, as well as short descriptions of what the variables document.

3.  Survey  item  text.  For  survey  items,  this  text  is  the  verbatim  item  wording.  For  other 
variables,  this  text  provides  a  verbal  description  of  the  variable. 

4.  Location  of  the  item  on  the  OS  data  file.  This  block  provides  the  location  of  the 
variable  on  the  OS  data  file.  The  OS  data  block  documents  (a)  the  starting  and  ending 
column  numbers  where  the  data  are  stored  and  (b)  the  number  of  columns  that  the  data 
occupy. 




5. SAS data file information. This block indicates the format name, variable type (character or numeric), length, and informat of the data in the SAS® data file. The last entry indicates the informat appropriate for reading the data from the OS data file.

6. Counts of item value responses. This column indicates the number of sample members who fall into the category corresponding to each value for the variable. The count provided for each variable value should correspond exactly to those that analysts would obtain when running unweighted frequencies on all 36,054 records in the accompanying database. Before running complex statistical analyses, analysts are encouraged to re-create these frequency tables (a minimal check is sketched after this list). Re-creating the counts minimally ensures that the data are being correctly read by the analysts' computers and programs.

7. Respondent percentages for each value. This column indicates the percentage of sample members who marked each variable value. The percentages are calculated by dividing the row value in the "FREQ" column by the total listed at the bottom of the "FREQ" column. The percentages provided for each variable value should correspond exactly to those that analysts would obtain when running unweighted frequencies on all 36,054 records in the accompanying database.

8. Response OS values. This column presents the OS (ASCII) code for the actual or recoded response values for each survey item. Further details on the values in this column are found in either the annotated survey form or in Appendix E. For example, all negative values are explained in Appendix E.

9.  Response  SAS®  values.  This  column  presents  the  SAS®  code  for  the  response  values 
for  each  variable.  Further  details  on  the  values  in  this  column  are  found  in  either  the 
annotated  survey  form  or  in  Appendix  E.  An  explanation  of  negative  values  is 
presented  in  Appendix  E. 

10. Explanation of the item value codes. This column presents brief verbal explanations of the OS and SAS® coding for each survey item. If the coded information corresponds to survey response alternatives, the text in the table is the verbatim response from the survey instrument. More detailed explanations are presented in the annotated survey form (Appendix D) and in Appendix E.

11. Total of response frequencies and percents. The number appearing at the bottom of the "FREQ" column is the total number of sample members in the basic-release file. This number is the same for every table in this codebook. That is, every sample member in the database is accounted for on every variable, even if the variable indicates only that the information was missing for that sample member. The number appearing at the bottom of the "PERCENT" column is typically 100.0. Rounding error, however, occasionally causes the total percentage to be slightly above or below 100.0.

12. Messages to analysts. The messages alert analysts to situations specific to a variable, including (a) rounding errors resulting in a total percentage other than 100 percent; (b) the variable having values that are "too numerous to list;" (c) extraction of the variable from another specified database; (d) creation of the variable from two or more variables specified in the message; and (e) further clarification of the survey item corresponding to the variable.

13. Codebook page number. This is the Appendix H page number corresponding to a specific variable. Appendixes F and G identify the page number in Appendix H where each variable can be found.
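As referenced in note 6, a minimal read-check might look like the following SAS step (the dataset and item names repeat the examples used in this report; the MISSING option displays the special missing codes such as .A and .B):

   * Read-check: unweighted frequencies for an example item should   ;
   * match the corresponding Appendix H codebook table exactly.      ;
   proc freq data=rcss06b;
      tables sr015 / missing;
   run;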




References 


Bureau  of  the  Census.  (2002,  March).  Current  Population  Survey:  Design  and  methodology 
(Technical  Paper  63RV).  Retrieved  May  31,  2002,  from 
http://www.census.gov/prod/2002pubs/tp63rv.pdf 

Chromy,  J.  R.  (1987).  Design  optimization  with  multiple  objectives.  Proceedings  of  the 
Section  on  Survey  Research  Methods,  194-199.  Alexandria,  VA:  American  Statistical 
Association. 

Cochran,  W.G.  (1977).  Sampling  techniques  (3rd  ed.).  New  York:  John  Wiley  &  Sons. 

Council  of  American  Survey  Research  Organizations.  (1982).  On  the  Definition  of  Response 
Rates  (special  report  of  the  CASRO  Task  Force  on  Completion  Rates,  Lester  R.  Frankel, 
Chair).  Port  Jefferson,  NY. 

Deever,  J.  A.,  &  Mason,  R.  E.  (2002).  DMDC  Sampling  Planning  Tool  (Version  2.0) 
[Computer  software].  Arlington,  VA:  DMDC. 

DMDC. (2006a). 2006 Survey of Reserve Component Spouses: Statistical methodology report (Report No. 2006-031). Arlington, VA: DMDC.

DMDC. (2006b). 2006 Survey of Reserve Component Spouses: Tabulation of responses (Report No. 2006-029). Arlington, VA: DMDC.

OMB  Bulletin  No.  00-02.  (2000).  Guidance  on  aggregation  and  allocation  of  data  on  race  for 
use  in  civil  rights  monitoring  and  enforcement.  Washington,  DC:  U.S.  Office  of 
Management  and  Budget. 

SAS Institute Inc. (2001). SAS/STAT User's Guide, Version 8. Cary, NC.

SPSS®  for  Windows™  [Computer  software].  (1993).  Chicago,  IL:  SPSS  Inc. 

Standards  for  Maintaining,  Collecting,  and  Presenting  Federal  Data  on  Race  and  Ethnicity,  62 
Fed.  Reg.  58781  (1997). 

SUDAAN® Software for the Statistical Analysis of Correlated Data [Computer software]. (1996). Research Triangle Park, NC: Research Triangle Institute.

Wolter,  K.M.  (1985).  Introduction  to  variance  estimation.  New  York:  Springer-Verlag. 




REPORT  DOCUMENTATION  PAGE 


Form  Approved 
OMB  No.  0704-0188 


The  public  reporting  burden  for  this  collection  of  information  is  estimated  to  average  1  hour  per  response,  including  the  time  for  reviewing  instructions,  searching  existing  data  sources, 
gathering  and  maintaining  the  data  needed,  and  completing  and  reviewing  the  collection  of  information.  Send  comments  regarding  this  burden  estimate  or  any  other  aspect  of  this  collection 
of  information,  including  suggestions  for  reducing  the  burden,  to  Department  of  Defense,  Washington  Headquarters  Services,  Directorate  for  Information  Operations  and  Reports 
(0704-0188),  1215  Jefferson  Davis  Highway,  Suite  1204,  Arlington,  VA  22202-4302.  Respondents  should  be  aware  that  notwithstanding  any  other  provision  of  law,  no  person  shall  be 
subject  to  any  penalty  for  failing  to  comply  with  a  collection  of  information  if  it  does  not  display  a  currently  valid  OMB  control  number. 


PLEASE DO NOT RETURN YOUR FORM TO THE ABOVE ADDRESS.

1. REPORT DATE (DD-MM-YYYY): 31-03-2007
2. REPORT TYPE: Final Report
3. DATES COVERED (From - To): November 2005 - June 2006

4. TITLE AND SUBTITLE: 2006 Survey of Reserve Component Spouses - Administration, Datasets, and Codebook
5a. CONTRACT NUMBER:
5b. GRANT NUMBER:
5c. PROGRAM ELEMENT NUMBER:
6. AUTHOR(S): Defense Manpower Data Center (DMDC)
5d. PROJECT NUMBER:
5e. TASK NUMBER:
5f. WORK UNIT NUMBER:

7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES): Defense Manpower Data Center, 1600 Wilson Boulevard, Suite 400, Arlington, VA 22209-2593
8. PERFORMING ORGANIZATION REPORT NUMBER: DMDC Report 2006-030

9. SPONSORING/MONITORING AGENCY NAME(S) AND ADDRESS(ES):
10. SPONSOR/MONITOR'S ACRONYM(S):
11. SPONSOR/MONITOR'S REPORT NUMBER(S):
12. DISTRIBUTION/AVAILABILITY STATEMENT: Approved for Public Release; distribution unlimited

13. SUPPLEMENTARY NOTES: Appendices not included. Requests for the appendices should be directed to Timothy Elig, DMDC, 1600 Wilson Blvd, Arlington, VA 22209, (703) 696-5790.

14. ABSTRACT: The 2006 Survey of Reserve Component Spouses (2006 RCSS) is part of the 2006 Survey of Military Spouses project. The 2006 RCSS was designed to measure the attitudes and opinions of spouses regarding a wide range of issues, including demographic information, marital history, employment status, feelings about military life, use of military programs and services, and number and age of children or other legal dependents. This report documents how the survey was fielded and the creation of reporting variables.


15.  SUBJECT  TERMS 

Survey,  Reserve  Component  Members,  Reserve  Component  Spouses,  Demographics,  Preparedness,  Children/Legal  Dependents, 
Health  and  Well  Being,  Employment  Status,  National  Guard/Reserves,  and  Programs  and  Services. 


16. SECURITY CLASSIFICATION OF: a. REPORT: U; b. ABSTRACT: U; c. THIS PAGE: U
17. LIMITATION OF ABSTRACT: UU
18. NUMBER OF PAGES: 44
19a. NAME OF RESPONSIBLE PERSON: Margaret Emma Holland
19b. TELEPHONE NUMBER (Include area code): (703) 696-4790

Standard  Form  298  (Rev.  8/98) 


Prescribed  by  ANSI  Std.  Z39.18